The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If we pick a sample from a finite set ''A'' uniformly at random, the information revealed once we know the outcome is given by the Hartley function
: <math>H_0(A) := \log_b |A| ,</math>
where |''A''| denotes the cardinality of ''A''.

If the base of the logarithm is 2, then the unit of uncertainty is the shannon. If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base the unit of information is called the hartley in his honor. It is also known as the Hartley entropy.

== Hartley function, Shannon's entropy, and Rényi entropy ==

The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is actually a special case of the Rényi entropy since
: <math>H_0(A) = \frac{1}{1-0}\log \sum_{i=1}^{|A|} p_i^0 = \log |A| .</math>
But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi, the Hartley function can be defined without introducing any notions of probability (see ''Uncertainty and information'' by George J. Klir, p. 423).
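The coincidence with the Shannon entropy under a uniform distribution can be checked numerically. The following is a minimal Python sketch, assuming base-2 logarithms so that both quantities are measured in shannons; the function names <code>hartley</code> and <code>shannon_entropy</code> are illustrative choices, not standard library functions.

<syntaxhighlight lang="python">
import math

def hartley(n, base=2):
    """Hartley function H_0 of a finite set with n elements: log_base(n)."""
    return math.log(n, base)

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution, skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

n = 8
uniform = [1 / n] * n

print(hartley(n))                # 3.0 shannons, since log2(8) = 3
print(shannon_entropy(uniform))  # also 3.0: the two coincide for a uniform distribution
print(hartley(n, base=10))       # about 0.903 hartleys, with Hartley's base-ten logarithm
</syntaxhighlight>

For a non-uniform distribution the two values differ: the Shannon entropy drops below <math>\log_b |A|</math>, while the Hartley function depends only on the size of the set.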